Geometric Methods for Spherical Data, with Applications to Cosmology
This survey is devoted to recent developments in the statistical analysis of
spherical data, with a view to applications in Cosmology. We will start from a
brief discussion of Cosmological questions and motivations, arguing that most
Cosmological observables are spherical random fields. Then, we will introduce
some mathematical background on spherical random fields, including spectral
representations and the construction of needlet and wavelet frames. We will
then focus on some specific issues, including tools and algorithms for map
reconstruction (\textit{i.e.}, separating the different physical components
which contribute to the observed field), geometric tools for testing the
assumptions of Gaussianity and isotropy, and multiple testing methods to detect
contamination in the field due to point sources. Although these tools are
introduced in the Cosmological context, they can be applied to other situations
dealing with spherical data. Finally, we will discuss more recent and
challenging issues such as the analysis of polarization data, which can be
viewed as realizations of random fields taking values in spin fiber bundles.
Comment: 25 pages, 6 figures
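To make the needlet construction mentioned above concrete, here is a short illustrative sketch (my own code, not from the survey) of the standard needlet window function b: its squared dilates b(l / B^j)^2 sum to one over multipoles l >= 1, the partition-of-unity property on which needlet frames rest. The function and parameter names are mine; B is the usual bandwidth parameter.

```python
import numpy as np

def needlet_b(xi, B=2.0, grid=20001):
    """Standard needlet window b(xi), supported on [1/B, B].

    Built so that sum_j b(l / B**j)**2 = 1 for every multipole l >= 1.
    """
    # C-infinity bump function f(t) = exp(-1/(1 - t^2)) on (-1, 1)
    t = np.linspace(-1.0, 1.0, grid)
    with np.errstate(divide="ignore"):
        f = np.where(np.abs(t) < 1.0, np.exp(-1.0 / (1.0 - t**2)), 0.0)
    # psi: normalised primitive of f, increasing from 0 to 1 on [-1, 1]
    psi = np.concatenate([[0.0], np.cumsum(0.5 * (f[1:] + f[:-1]) * np.diff(t))])
    psi /= psi[-1]

    def phi(s):
        s = np.atleast_1d(np.asarray(s, dtype=float))
        out = np.ones_like(s)                    # phi = 1 on [0, 1/B]
        mid = (s > 1.0 / B) & (s < 1.0)
        u = 1.0 - (2.0 * B / (B - 1.0)) * (s[mid] - 1.0 / B)
        out[mid] = np.interp(u, t, psi)          # smooth descent on (1/B, 1)
        out[s >= 1.0] = 0.0                      # phi = 0 past 1
        return out

    b2 = np.clip(phi(np.asarray(xi) / B) - phi(np.asarray(xi)), 0.0, None)
    return np.sqrt(b2)
```

Because the sum over j telescopes through phi, the partition of unity holds exactly; e.g. summing needlet_b(10 / 2**j)**2 over j returns 1.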
Needlet Multiple Testing for Point Source Detection in Planck CMB Anisotropy Maps
The Cosmic Microwave Background is one of the most important sources of information about the Early Universe, since it is composed of photons emitted when the Universe was only 377 000 years old. Observing it, however, entails several difficulties; one of them is the presence of unresolved point sources in the foreground. In this work, we implement a recent algorithm that takes advantage of needlet filtering and multiple testing statistics. We create a flexible and complete programme and apply it to simulations and to Planck anisotropy maps. In the simulations, the algorithm recovers essentially all point sources with an intensity above 3 to 4, with a very low percentage of false detections. On some of the Planck maps, the algorithm detects a population of maxima incompatible with the assumed Gaussianity of the Cosmic Microwave Background.
Innsbruck, Univ., Master's thesis, 2018
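The filter-then-threshold idea behind this kind of point-source search can be sketched in a toy flat-sky setting (illustrative only: Gaussian smoothing of a flat patch stands in for needlet filtering on the sphere, and all names are mine):

```python
import numpy as np
from scipy import ndimage

def detect_sources(field, sigma=2.0, snr=4.5):
    """Smooth the map, then keep local maxima above snr filtered-noise sigmas.

    Gaussian smoothing here is a crude stand-in for the needlet band-pass
    filtering used on real CMB maps.
    """
    filt = ndimage.gaussian_filter(field, sigma)
    is_peak = filt == ndimage.maximum_filter(filt, size=5)
    return np.argwhere(is_peak & (filt > snr * filt.std()))

rng = np.random.default_rng(1)
sky = rng.normal(size=(128, 128))   # unit-variance noise background
sky[40, 80] += 50.0                 # inject one bright point source
hits = detect_sources(sky)          # pixel coordinates of detections
```

Filtering boosts the source relative to the noise because the point source is concentrated on the filter scale while the noise is not; the injected source at (40, 80) is recovered while noise peaks stay below the threshold.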
A novel Cosmic Filament catalogue from SDSS data
In this work we present a new catalogue of Cosmic Filaments obtained from the
latest Sloan Digital Sky Survey (SDSS) public data. In order to detect
filaments, we implement a version of the Subspace-Constrained Mean-Shift
algorithm, boosted by Machine Learning techniques. This allows us to detect
cosmic filaments as one-dimensional maxima in the galaxy density distribution.
Our filament catalogue uses the cosmological sample of SDSS, including Data
Release 16, so it inherits its sky footprint (aside from small border effects)
and redshift coverage. In particular, this means that, taking advantage of the
quasar sample, our filament reconstruction covers redshifts up to ,
making it one of the deepest filament reconstructions to our knowledge. We
follow a tomographic approach and slice the galaxy data into 269 shells at
different redshifts. The reconstruction algorithm is applied to 2D spherical
maps. The catalogue provides the position and uncertainty of each detection for
each redshift slice. We assess the quality of the detections with several
metrics, which show improvement with respect to previous public catalogues
obtained with similar methods. We also detect a highly significant correlation
between our filament catalogue and galaxy cluster catalogues built from
microwave observations of the Planck Satellite and the Atacama Cosmology
Telescope.
Comment: 23 pages, 20 figures, version accepted for publication in A&
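A minimal 2-D Subspace-Constrained Mean-Shift sketch (my own simplified illustration, not the catalogue's boosted implementation) shows the core idea: each walker takes mean-shift steps projected onto the lowest-curvature eigendirection of the local log-density Hessian, so it converges to a one-dimensional density ridge rather than to a density mode.

```python
import numpy as np

def scms(data, walkers, h=0.3, iters=80):
    """Subspace-constrained mean shift: pull walkers onto 1-D ridges in 2-D.

    For each walker: form a Gaussian-kernel density estimate, compute the
    mean-shift vector and the Hessian of the log-density, then project the
    step onto the Hessian eigenvector with the smallest eigenvalue.
    """
    pts = walkers.copy()
    for _ in range(iters):
        for k in range(len(pts)):
            d = data - pts[k]                              # (n, 2) offsets
            w = np.exp(-0.5 * np.sum(d**2, axis=1) / h**2)
            w /= w.sum()
            mu = w @ d                                     # mean-shift vector
            # Hessian of log p_hat at the walker (Gaussian kernel)
            H = (d.T * w) @ d / h**4 - np.eye(2) / h**2 - np.outer(mu, mu) / h**4
            _, vecs = np.linalg.eigh(H)                    # ascending eigenvalues
            v = vecs[:, :1]                                # steepest-curvature axis
            pts[k] = pts[k] + (v @ v.T) @ mu               # projected step
    return pts

# toy data: a noisy ring whose density ridge is close to the unit circle
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, 1200)
radius = 1.0 + 0.15 * rng.normal(size=theta.size)
ring = np.column_stack([radius * np.cos(theta), radius * np.sin(theta)])
ridge = scms(ring, ring[::24])
```

After the iterations the walkers collapse onto a thin circle: their radii have small scatter around the (slightly inward-biased) KDE ridge, which is what "one-dimensional maxima of the galaxy density distribution" means in practice.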
Point Source Detection and False Discovery Rate Control on CMB Maps
We discuss a new procedure to search for point sources in Cosmic Microwave
Background maps; in particular, we aim at controlling the so-called False
Discovery Rate, which is defined as the expected proportion of false
discoveries among the pixels labelled as contaminated by point sources. We
exploit a
procedure called STEM, which is based on the following four steps: 1) needlet
filtering of the observed CMB maps, to improve the signal-to-noise ratio; 2)
selection of candidate peaks, i.e., the local maxima of the filtered maps; 3)
computation of \emph{p}-values for the local maxima; 4) implementation of the
multiple testing procedure, by means of the so-called Benjamini-Hochberg
method. Our procedures are also implemented on the latest release of Planck CMB
maps.
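Step 4 can be illustrated with a minimal Benjamini-Hochberg routine (the standard method; variable names and the toy p-values are mine). Given the peak p-values from step 3, it returns which peaks to flag so that, under the usual independence or positive-dependence assumptions, the FDR is controlled at level alpha:

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Boolean mask of rejected hypotheses, FDR controlled at level alpha."""
    p = np.asarray(pvals, dtype=float)
    n = p.size
    order = np.argsort(p)
    # largest k with p_(k) <= alpha * k / n
    below = p[order] <= alpha * np.arange(1, n + 1) / n
    k = int(np.nonzero(below)[0].max()) + 1 if below.any() else 0
    reject = np.zeros(n, dtype=bool)
    reject[order[:k]] = True      # reject the k smallest p-values
    return reject

# p-values of candidate peaks (toy numbers): three strong, two null-like
peaks = [0.01, 0.6, 0.02, 0.5, 0.03]
flagged = benjamini_hochberg(peaks, alpha=0.05)   # flags the three small ones
```

Note that the step-up rule rejects all three small p-values even though 0.03 exceeds alpha * 1/n; in the STEM setting the input p-values are those computed for the local maxima in step 3, not pointwise pixel p-values.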